
    A highly-ionized absorber as a new explanation for the spectral changes during dips from X-ray binaries

    Until now, the spectral changes observed from persistent to dipping intervals in dipping low-mass X-ray binaries were explained by invoking progressive and partial covering of an extended emission region. Here we propose a novel and simpler explanation for these spectral changes, which requires neither partial covering nor an extended corona, and which has the further advantage of self-consistently explaining the spectral changes in both the continuum and the narrow absorption lines now revealed by XMM-Newton. In 4U 1323-62, we detect Fe XXV and Fe XXVI absorption lines and model them for the first time with a complete photo-ionized absorber model rather than with individual Gaussian profiles. We demonstrate that the spectral changes in both the continuum and the lines can be modeled simply by variations in the properties of the ionized absorber: from persistent to dipping intervals, the photo-ionization parameter decreases while the equivalent hydrogen column density of the ionized absorber increases. In a recent work (see Diaz Trigo et al. in these proceedings), we show that our new approach can be successfully applied to all the other dipping sources observed by XMM-Newton. Comment: 5 pages, 5 figures, to appear in the proceedings of "The X-ray Universe 2005", San Lorenzo de El Escorial (Spain), 26-30 September 2005
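The quantity driving the interpretation above is the photo-ionization parameter, conventionally ξ = L / (n r²), while the column density scales with the absorber's density and thickness. A minimal sketch, using purely illustrative values (not the fitted parameters for 4U 1323-62), shows how a denser blob of gas crossing the line of sight lowers ξ while raising N_H:

```python
import math

def ionization_parameter(L, n, r):
    """Photo-ionization parameter xi = L / (n * r^2), in erg cm s^-1."""
    return L / (n * r**2)

# Illustrative (assumed) values only:
L = 1e37   # source luminosity, erg/s
r = 1e10   # distance of the absorber from the source, cm
dr = 1e9   # thickness of the absorbing layer, cm

for phase, n in [("persistent", 1e12), ("dipping", 1e13)]:
    xi = ionization_parameter(L, n, r)
    N_H = n * dr  # equivalent hydrogen column density, cm^-2
    print(f"{phase}: log xi = {math.log10(xi):.1f}, N_H = {N_H:.0e} cm^-2")
```

Raising the density tenfold lowers log ξ by one while the column density increases tenfold, reproducing the qualitative trend reported from persistent to dipping intervals.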

    Long-term and seasonal variations in CO2: linkages to catchment alkalinity generation

    As atmospheric emissions of S have declined in the Northern Hemisphere, there has been an expectation of increased pH and alkalinity in streams believed to have been acidified by excess S and N. Many streams and lakes have not recovered. Evidence from East Bear Brook in Maine, USA, and modelling with the groundwater acid-base model MAGIC (Cosby et al. 1985a,b) indicate that seasonal and yearly variations in soil PCO2 are adequate to enhance or even reverse the acid-base (alkalinity) changes anticipated from modest decreases of SO4 in surface waters. Alkalinity is generated in the soil by exchange of H+ from the dissociation of H2CO3, which in turn is derived from the dissolution of soil CO2. The variation in soil PCO2 produces an alkalinity variation of up to 15 meq L-1 in stream water. Detecting and relating increases in alkalinity to decreases in stream SO4 are significantly more difficult in the short term because of this effect. For example, modelled alkalinity recovery at Bear Brook due to a decline of 20 meq SO4 L-1 in soil solution is compensated by a decline in soil air PCO2 from 0.4 to 0.2%. This compensating ability decays over time as base saturation declines. Variable PCO2 has less effect in more acidic soils. Short-term decreases of PCO2 below the long-term average value produce short-term decreases in alkalinity, whereas short-term increases in PCO2 produce short-term alkalization. Trend analysis for detecting the recovery of streams and lakes from acidification after reduced atmospheric emissions will therefore require a longer monitoring period for statistical significance than previously appreciated. Keywords: CO2, alkalinity, acidification, recovery, soils, climate change
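The CO2-to-alkalinity mechanism can be sketched numerically. Assuming standard 25 °C equilibrium constants and a fixed pH (both are our assumptions, not values from the study), halving soil-air PCO2 from 0.4% to 0.2% halves the dissolved carbonic acid and hence the bicarbonate it can supply:

```python
# Minimal carbonate-equilibrium sketch (assumed 25 degC constants):
# CO2(g) <-> H2CO3*        Henry's law,  K_H ~ 10^-1.47 mol L^-1 atm^-1
# H2CO3* <-> H+ + HCO3-    first dissociation, K_1 ~ 10^-6.35
K_H = 10**-1.47
K_1 = 10**-6.35

def bicarbonate(p_co2_atm, h_plus):
    """[HCO3-] in mol/L for a given soil-air PCO2 at fixed [H+]."""
    h2co3 = K_H * p_co2_atm      # dissolved CO2 via Henry's law
    return K_1 * h2co3 / h_plus  # equilibrium [HCO3-]

pH = 5.5                         # assumed, for illustration
h = 10**-pH
for p in (0.002, 0.004):         # 0.2% vs 0.4% CO2 in soil air
    print(f"PCO2 = {p:.3f} atm -> [HCO3-] = {bicarbonate(p, h) * 1e6:.1f} ueq/L")
```

At fixed pH, [HCO3-] is directly proportional to PCO2, so the modelled drop from 0.4 to 0.2% CO2 removes carbonic-acid-derived alkalinity of the same order as the variation the abstract describes.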

    Physics, Topology, Logic and Computation: A Rosetta Stone

    In physics, Feynman diagrams are used to reason about quantum processes. In the 1980s, it became clear that underlying these diagrams is a powerful analogy between quantum physics and topology: namely, a linear operator behaves very much like a "cobordism". Similar diagrams can be used to reason about logic, where they represent proofs, and computation, where they represent programs. With the rise of interest in quantum cryptography and quantum computation, it became clear that there is an extensive network of analogies between physics, topology, logic and computation. In this expository paper, we make some of these analogies precise using the concept of "closed symmetric monoidal category". We assume no prior knowledge of category theory, proof theory or computer science. Comment: 73 pages, 8 encapsulated postscript figures
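The shared structure the paper makes precise can be hinted at with a toy "process" type: sequential composition corresponds to gluing cobordisms, cutting proofs, or sequencing programs, while the tensor corresponds to running processes side by side. This is only an informal illustration under our own naming, not the paper's formalism:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Morphism:
    """A process with an explicit domain and codomain (tuples of type labels)."""
    dom: tuple
    cod: tuple
    fn: Callable

    def then_(self, other):
        # Sequential composition: only defined when codomain meets domain,
        # like gluing two cobordisms along a shared boundary.
        assert self.cod == other.dom, "codomain/domain mismatch"
        return Morphism(self.dom, other.cod, lambda x: other.fn(self.fn(x)))

    def tensor(self, other):
        # Parallel (monoidal) composition: domains and codomains concatenate.
        k = len(self.dom)
        return Morphism(self.dom + other.dom, self.cod + other.cod,
                        lambda xy: self.fn(xy[:k]) + other.fn(xy[k:]))

double = Morphism(("int",), ("int",), lambda x: (2 * x[0],))
inc    = Morphism(("int",), ("int",), lambda x: (x[0] + 1,))
print(double.then_(inc).fn((3,)))     # -> (7,)   sequential: 2*3 + 1
print(double.tensor(inc).fn((3, 4)))  # -> (6, 5) parallel: double | inc
```

The point of the analogy is that these two composition operations, plus a symmetry for swapping parallel wires, are exactly what a symmetric monoidal category axiomatizes.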

    Measurement of the Charged Multiplicities in b, c and Light Quark Events from Z0 Decays

    Average charged multiplicities have been measured separately in $b\bar{b}$, $c\bar{c}$ and light quark ($u$, $d$, $s$) events from $Z^0$ decays measured in the SLD experiment. Impact parameters of charged tracks were used to select enriched samples of $b\bar{b}$ and light quark events, and reconstructed charmed mesons were used to select $c\bar{c}$ events. We measured the charged multiplicities: $\bar{n}_{uds} = 20.21 \pm 0.10\,(\mathrm{stat.}) \pm 0.22\,(\mathrm{syst.})$, $\bar{n}_{c} = 21.28 \pm 0.46\,(\mathrm{stat.})\,^{+0.41}_{-0.36}\,(\mathrm{syst.})$ and $\bar{n}_{b} = 23.14 \pm 0.10\,(\mathrm{stat.})\,^{+0.38}_{-0.37}\,(\mathrm{syst.})$, from which we derived the differences between the total average charged multiplicities of $c$ or $b$ quark events and light quark events: $\Delta\bar{n}_c = 1.07 \pm 0.47\,(\mathrm{stat.})\,^{+0.36}_{-0.30}\,(\mathrm{syst.})$ and $\Delta\bar{n}_b = 2.93 \pm 0.14\,(\mathrm{stat.})\,^{+0.30}_{-0.29}\,(\mathrm{syst.})$. We compared these measurements with those at lower center-of-mass energies and with perturbative QCD predictions. The combined results are in agreement with the QCD expectations and disfavor the hypothesis of flavor-independent fragmentation. Comment: 19 pages LaTeX, 4 EPS figures, to appear in Physics Letters
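The quoted differences follow directly from the per-flavor multiplicities, with the statistical errors combined in quadrature since the samples are independent (the systematic errors are partly correlated between samples and do not combine this simply, which is why the quoted systematic uncertainties are smaller than a naive quadrature sum). A quick arithmetic check:

```python
import math

def subtract(a, da, b, db):
    """a - b, with uncorrelated statistical errors combined in quadrature."""
    return a - b, math.sqrt(da**2 + db**2)

# Measured average charged multiplicities (value, statistical error):
n_uds, stat_uds = 20.21, 0.10
n_c,   stat_c   = 21.28, 0.46
n_b,   stat_b   = 23.14, 0.10

dnc, dnc_err = subtract(n_c, stat_c, n_uds, stat_uds)
dnb, dnb_err = subtract(n_b, stat_b, n_uds, stat_uds)
print(f"Delta n_c = {dnc:.2f} +/- {dnc_err:.2f} (stat.)")  # 1.07 +/- 0.47
print(f"Delta n_b = {dnb:.2f} +/- {dnb_err:.2f} (stat.)")  # 2.93 +/- 0.14
```

Both reproduce the central values and statistical errors quoted in the abstract.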

    Integrating sequence and array data to create an improved 1000 Genomes Project haplotype reference panel

    A major use of the 1000 Genomes Project (1000GP) data is genotype imputation in genome-wide association studies (GWAS). Here we develop a method to estimate haplotypes from low-coverage sequencing data that can take advantage of single-nucleotide polymorphism (SNP) microarray genotypes on the same samples. First the SNP array data are phased to build a backbone (or 'scaffold') of haplotypes across each chromosome. We then phase the sequence data 'onto' this haplotype scaffold. This approach can take advantage of relatedness between sequenced and non-sequenced samples to improve accuracy. We use this method to create a new 1000GP haplotype reference set for use by the human genetic community. Using a set of validation genotypes at SNPs and bi-allelic indels, we show that these haplotypes have lower genotype discordance and improved imputation performance into downstream GWAS samples, especially at low-frequency variants. © 2014 Macmillan Publishers Limited. All rights reserved.
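The scaffold idea can be caricatured in a few lines: reads that cover both a new sequence variant and an already-phased scaffold site vote on which scaffold haplotype carries the new alternate allele. The actual method resolves this statistically across many samples rather than by per-sample majority vote, so the function and data below are purely a hypothetical illustration:

```python
def phase_onto_scaffold(reads, scaffold_hap0, scaffold_hap1):
    """Place a new heterozygous site onto a phased scaffold.

    Each read is (allele_at_new_site, scaffold_pos, allele_at_scaffold_pos),
    with alleles coded 0 (REF) / 1 (ALT). Returns the (hap0, hap1) alleles
    assigned to the new site.
    """
    votes_hap0 = 0  # net evidence that the ALT allele rides on haplotype 0
    for allele, pos, scaff_allele in reads:
        on_hap0 = scaffold_hap0[pos] == scaff_allele
        if allele == 1:
            votes_hap0 += 1 if on_hap0 else -1
        else:
            votes_hap0 += -1 if on_hap0 else 1
    return (1, 0) if votes_hap0 > 0 else (0, 1)

# Made-up scaffold het sites for one sample (positions 0..2):
hap0 = [0, 1, 0]
hap1 = [1, 0, 1]
# Three reads, all linking the new ALT allele to hap1's alleles:
reads = [(1, 0, 1), (1, 2, 1), (0, 1, 1)]
print(phase_onto_scaffold(reads, hap0, hap1))  # -> (0, 1): ALT on hap1
```

The payoff described in the abstract is that the scaffold fixes long-range phase from the array data, so the low-coverage sequence reads only have to resolve which side of an already-known pair each new allele belongs to.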